Producing information through physics alone
I have accepted the challenge from a comment on Uncommon Descent to give an account of how physics can produce information. The answer is going to be too long to post as a response on that forum, so I am writing it up here. I take the challenge to mean producing information without involving living things. I believe that living things ultimately come down to physics, but to satisfy the commenter I presumably need to demonstrate how information could arise without life. I would also add that we are, I assume, talking about a form of information that is found in all living things.
This is going to be a longish post because there are so many definitions of information; in fact it is going to involve some rather tedious defining of terms. I have chosen four definitions. Two are defined mathematically; the other two are closer to common English usage but less precisely defined. Three of them can be created by both living and non-living things. The fourth is a type of information that can only be produced by living things, but it is not itself found in living things (with very rare exceptions). The first two, the mathematical definitions, will be excruciatingly familiar to anyone following the ID debate.
Mathematical definitions
Definition 1: Kolmogorov complexity
I am aware that this is a very crude approximation to the large subject of algorithmic information theory, but it does give a handle on one mathematical approach to information. In this approach the amount of information in a signal is measured by its Kolmogorov complexity: roughly, the length of the shortest program that can generate the signal. So a string of 20 zeroes has very little information, because it can be compressed to the simple algorithm "repeat 0 20 times", while a string of 20 random digits has high information, because there is no shorter way to generate it than to write out the digits themselves. Clearly physics generates outcomes with a lot of information in this sense all the time: almost any meteorological outcome (wind, pressure, temperature) is sufficiently random to be full of information.
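To make this concrete, here is a minimal sketch in Python (my own illustration, not part of the original challenge). Kolmogorov complexity is uncomputable, so the sketch uses zlib compression as a crude stand-in, and it uses strings of 1,000 characters rather than 20 because a general-purpose compressor's overhead swamps very short inputs:

    import random
    import zlib

    # Kolmogorov complexity is uncomputable; as a rough stand-in, use the
    # length of the zlib-compressed string. Highly patterned strings
    # compress to almost nothing; random strings barely compress at all.
    def compressed_size(s: str) -> int:
        return len(zlib.compress(s.encode("ascii")))

    repetitive = "0" * 1000  # the "repeat 0 a thousand times" string
    random_digits = "".join(random.choice("0123456789") for _ in range(1000))

    print(compressed_size(repetitive))     # a dozen or so bytes: easy to describe
    print(compressed_size(random_digits))  # hundreds of bytes: nearly incompressible

On a typical run the repetitive string compresses to around a dozen bytes while the random digits stay in the hundreds, which is precisely the sense in which the random string carries more information.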
Definition 2: Dembski's information
William Dembski has defined information in terms of the improbability of an outcome meeting a specification. So a bridge hand of 13 spades is full of information, because it is very improbable that a bridge hand will meet the specification "all the same suit". There is much dispute over how to come up with a specification and how to calculate the probabilities in realistic situations. Nevertheless, there are plenty of cases where physics has come up with outcomes that conform to an interesting pattern and where it is extremely unlikely that this would happen by chance. One famous example is the fact that the orbits of the planets of the solar system (excluding the non-planet Pluto) all lie in nearly the same plane. Common alignment seems like a rather special outcome, worthy of being called specified, and given that all orbital planes are equally likely (the principle of indifference), the chance of them all being this aligned or closer is less than 1 in 4*10^9. That probability is large compared to some of ID's favourite very small probabilities, but surely small enough to count as information.
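Both figures are easy to check, at least under stated assumptions. Here is a minimal Python sketch (again my own illustration): the bridge probability is exact, while the orbital estimate uses the principle of indifference, ignores the direction of revolution, and assumes a 7-degree inclination spread, which is my illustrative figure rather than one taken from any source:

    from math import comb, cos, radians

    # A 13-card bridge hand that is all spades: exactly one qualifying
    # hand out of C(52, 13) equally likely hands.
    p_all_spades = 1 / comb(52, 13)
    print(p_all_spades)  # about 1.6e-12, i.e. 1 in 635,013,559,600

    # Orbital planes: if a plane's pole points in a uniformly random
    # direction (ignoring the direction of revolution), the chance that
    # its inclination to a fixed reference plane is within theta is
    # 1 - cos(theta). Measure seven planets against the eighth and
    # assume a 7-degree spread (my illustrative assumption):
    theta = radians(7)
    p_aligned = (1 - cos(theta)) ** 7
    print(p_aligned)  # around 1e-15, comfortably below 1 in 4*10^9

Under these assumptions the alignment probability comes out far smaller than the post's bound of 1 in 4*10^9, so the bound itself looks conservative.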
Common English usage
I discussed this once before in another post, but that was a long time ago, so I will start again. In ordinary English, when we talk about the information that something gives us, we are usually talking about what we are able to learn from it (assuming we didn't know it already). So if a newspaper or book is full of information, it has lots of facts which we can glean from studying it. In this context it is important to distinguish two ways in which an object or event can convey information. They correspond to Grice's distinction between natural and non-natural meaning (I will say "unnatural" below for short). To use Grice's famous example:
(1) I show Mr. X a photograph of Mr. Y displaying undue familiarity to Mrs. X. This gives some information to Mr. X, but it would do so even if he had found the photograph accidentally, or if he had seen an image that had been created completely naturally. I will call this natural information.
(2) I draw a picture of Mr. Y behaving in this manner and show it to Mr. X. This only works if Mr. X recognises my intention in drawing the picture; otherwise it could equally easily be me doodling. I will call this unnatural information. Virtually all information conveyed by written or spoken language is in this category. Words rely on conventions, and those conventions rely on the recipient understanding the intention with which they are written or spoken. An English word spoken by a two-year-old who is simply copying the sound conveys no information related to the meaning of the word (though it might of course convey natural information about the development of the child).
Definition 3: Natural information
Clearly physics produces outcomes full of natural information all the time. A dark cloud contains the natural information that rain is likely.
Definition 4: Unnatural information
Unnatural information clearly does require a living thing to produce it. It needs a communicator to have an intention and a recipient to understand that intention, and both have to be living things capable of having intentions. But, with the possible exception of a few efforts from Craig Venter, this type of information is not found in life. DNA and proteins are not arranged to tell anyone anything; they worked just fine for four billion years without any person or living thing being aware of them.
These are only four of many possible definitions of information. Maybe someone can come up with a crisp definition of information that:
1) Is found in all living things
2) Cannot be found elsewhere
3) Corresponds to some currently accepted usage of the word information